
Wild West (proper noun)

  • (historical) The western United States during the 19th-century era of settlement, commonly believed to be lawless and unruly.
Italian: Far West